Dynamic causal modeling analysis reveals the modulation of motor cortex and integration in superior temporal gyrus during multisensory speech perception
Abstract
The processing of speech information from various sensory modalities is crucial for human communication. Both the left posterior superior temporal gyrus (pSTG) and the motor cortex are importantly involved in multisensory speech perception. However, the dynamic integration from primary sensory regions to pSTG remains unclear. Here, we implemented a behavioral experiment using the classical McGurk effect paradigm and acquired task functional magnetic resonance imaging (fMRI) data during synchronized audiovisual syllabic perception from 63 normal adults. We conducted dynamic causal modeling (DCM) analysis to explore cross-modal interactions among the pSTG, the precentral gyrus (PrG), the middle superior temporal gyrus (mSTG), and the fusiform gyrus (FuG). Bayesian model selection favored a winning model that included modulations of the connections into PrG (mSTG → PrG, FuG → PrG), out of PrG (PrG → mSTG, PrG → FuG), and into pSTG (mSTG → pSTG, FuG → pSTG). Moreover, the coupling strengths of the above connections correlated with McGurk susceptibility. In addition, significant differences in these connections were found between strong and weak McGurk perceivers. Strong perceivers were modulated by less inhibitory visual influence, allowed less excitatory auditory information to flow into the motor cortex, but integrated more audiovisual information in pSTG. Taken together, our findings show that the motor cortex interacts dynamically with sensory cortices during multisensory speech perception, and support the view that it plays a specific role in modulating the gain and salience between modalities.
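For readers unfamiliar with DCM, the modulatory coupling parameters discussed here correspond to the B matrices of the standard bilinear DCM state equation (a general formulation from the DCM literature, not spelled out in this abstract):

$$\dot{z} = \Bigl(A + \sum_{j} u_j\, B^{(j)}\Bigr) z + C u$$

where $z$ denotes the neuronal states of the modeled regions (here pSTG, PrG, mSTG, FuG), $A$ the intrinsic (endogenous) connections, $B^{(j)}$ the modulation of those connections by experimental input $u_j$ (here the audiovisual speech condition), and $C$ the direct driving inputs. The coupling strengths compared between strong and weak perceivers are entries of the $B$ matrices.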
Similar resources
The Analysis of the Role of the Speech Acts Theory in Translating and Dubbing Hollywood Films
Among the most pivotal effects a feature film creates are the dialogues delivered by its actors. From a filmmaker's viewpoint, one way of affecting the audience is through the intended force of the speakers' utterances, such as emotional, frightening, sad, or exciting force. This study investigates whether the perlocutionary force of actors' utterances, as speech acts, is reproduced in the dubbed versions of five Hollywood films...
Dynamic Modeling Approaches for Audiovisual Speech Perception and Multisensory Integration
Multimodal information including auditory, visual and even haptic information is integrated during speech perception. Articulatory information provided by a talker's face enhances speech intelligibility in congruent and temporally coincident signals, and produces a perceptual fusion (e.g. the "McGurk effect") when the auditory and visual signals are incongruent. This paper focuses on promising ...
Multisensory speech perception without the left superior temporal sulcus
Converging evidence suggests that the left superior temporal sulcus (STS) is a critical site for multisensory integration of auditory and visual information during speech perception. We report a patient, SJ, who suffered a stroke that damaged the left temporo-parietal area, resulting in mild anomic aphasia. Structural MRI showed complete destruction of the left middle and posterior STS, as well a...
متن کاملthe stady and analysis of rice agroclimatology in lenjan
The west of Esfahan province, Iran, is one of the most important agricultural areas throughout the country due to the climate variability and life-giving water of Zayanderood river. Rice is one of the major and economic crops in this area. The most important climatic elements in agricultural activities which should be considered include temperature, relative humidity, precipitation and wind. So...
On the tip of the tongue: Modulation of the primary motor cortex during audiovisual speech perception
Recent neurophysiological studies show that cortical brain regions involved in the planning and execution of speech gestures are also activated in processing speech sounds. These findings suggest that speech perception is in part mediated by reference to the motor actions afforded in the speech signal. Since interactions between auditory and visual modalities are beneficial in speech perception...
Journal
Journal title: Cognitive Neurodynamics
Year: 2023
ISSN: 1871-4080, 1871-4099
DOI: https://doi.org/10.1007/s11571-023-09945-z